Nonparametric entropy estimation: an overview∗
Abstract
We assume throughout that the differential entropy H(f) = −∫ f(x) log f(x) dx is well-defined and finite. The concept of differential entropy was introduced in Shannon's original paper ([55]). Since then, entropy has been of great theoretical and applied interest. The basic properties ...

∗ This research was supported by the Scientific Exchange Program between the Belgian Academy of Sciences and the Hungarian Academy of Sciences in the field of Mathematical Information Theory, and by NATO Research Grant No. CRG 931030.
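Overviews of this area commonly organize nonparametric estimators of H(f) into plug-in (density-based), sample-spacing, and nearest-neighbor families. As a concrete illustration of the last family, here is a minimal sketch of the classical Kozachenko-Leonenko k-nearest-neighbor estimator; the function name kl_entropy and the default k = 1 are ours, and the formula is the standard textbook form rather than anything quoted from the paper.

    import numpy as np
    from scipy.spatial import cKDTree
    from scipy.special import digamma, gammaln

    def kl_entropy(x, k=1):
        # Kozachenko-Leonenko k-NN estimate of H(f) in nats,
        # for an (n, d) array of i.i.d. samples from f.
        x = np.atleast_2d(x)
        n, d = x.shape
        # distance from each point to its k-th nearest neighbour
        # (query k+1 points because the closest one is the point itself)
        eps = cKDTree(x).query(x, k=k + 1)[0][:, k]
        # log-volume of the d-dimensional unit Euclidean ball
        log_cd = (d / 2.0) * np.log(np.pi) - gammaln(d / 2.0 + 1.0)
        return digamma(n) - digamma(k) + log_cd + d * np.mean(np.log(eps))

    # sanity check: N(0,1) has H = 0.5*log(2*pi*e), about 1.419 nats
    rng = np.random.default_rng(0)
    print(kl_entropy(rng.standard_normal((5000, 1))))

A standard argument for this family over kernel plug-in estimates is that it requires no bandwidth choice; the nearest-neighbor distances adapt to the local scale of the data automatically.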
Similar Articles
An Assessment of Hermite Function Based Approximations of Mutual Information Applied to Independent Component Analysis
At the heart of many ICA techniques is a nonparametric estimate of an information measure, usually via nonparametric density estimation, for example, kernel density estimation. While not as popular as kernel density estimators, orthogonal functions can be used for nonparametric density estimation (via a truncated series expansion whose coefficients are calculated from the observed data). While ...
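To make the truncated-series idea concrete, here is a minimal sketch of a Hermite-function orthogonal series density estimator, assuming standardized one-dimensional data; the function names and the truncation degree are illustrative, not taken from the paper.

    import numpy as np

    def hermite_functions(x, degree):
        # Orthonormal Hermite functions psi_0..psi_degree evaluated at x,
        # via the stable three-term recurrence
        #   psi_{j+1} = x*sqrt(2/(j+1))*psi_j - sqrt(j/(j+1))*psi_{j-1}.
        x = np.atleast_1d(np.asarray(x, dtype=float))
        psi = np.empty((degree + 1, x.size))
        psi[0] = np.pi ** -0.25 * np.exp(-0.5 * x**2)
        if degree >= 1:
            psi[1] = np.sqrt(2.0) * x * psi[0]
        for j in range(1, degree):
            psi[j + 1] = (x * np.sqrt(2.0 / (j + 1)) * psi[j]
                          - np.sqrt(j / (j + 1.0)) * psi[j - 1])
        return psi

    def hermite_density(sample, degree=10):
        # Since the psi_j are orthonormal in L2(R), the series coefficients
        # c_j = E[psi_j(X)] are estimated by their sample means.
        coef = hermite_functions(sample, degree).mean(axis=1)
        return lambda x: coef @ hermite_functions(x, degree)

    rng = np.random.default_rng(0)
    fhat = hermite_density(rng.standard_normal(2000))
    print(fhat([0.0]))   # true N(0,1) density at 0 is about 0.3989

One known drawback, relevant when such an estimate feeds a mutual information measure: nothing forces the truncated series to be nonnegative, so the estimate can dip below zero in the tails.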
Shape Constrained Density Estimation via Penalized Rényi Divergence
Shape constraints play an increasingly prominent role in nonparametric function estimation. While considerable recent attention has been focused on log-concavity as a regularizing device in nonparametric density estimation, weaker forms of concavity constraints encompassing larger classes of densities have received less attention but offer some additional flexibility. Heavier tail behavior ...
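As a point of reference for the log-concavity constraint mentioned above, the log-concave maximum likelihood estimator can be sketched on a grid: maximizing (1/n) Σ g(X_i) − ∫ e^g over concave g yields the MLE log-density. The discretization below is our own illustration and is not the paper's penalized Rényi divergence formulation.

    import numpy as np
    from scipy.optimize import minimize, LinearConstraint

    def logconcave_mle_grid(sample, grid):
        # Log-concave MLE, discretized: g holds the log-density at the grid
        # points, and concavity is imposed via second differences <= 0.
        sample = np.asarray(sample, dtype=float)
        G, dx = len(grid), grid[1] - grid[0]
        # coarse binning: empirical weight attached to each grid point
        idx = np.clip(np.round((sample - grid[0]) / dx).astype(int), 0, G - 1)
        w = np.bincount(idx, minlength=G) / len(sample)

        def neg_objective(g):
            # -(1/n) sum_i g(x_i) + integral exp(g); the exp(g) term
            # self-normalizes, so the minimizer is the MLE log-density.
            return -(w @ g) + np.exp(g).sum() * dx

        D2 = np.diff(np.eye(G), n=2, axis=0)   # second-difference operator
        res = minimize(neg_objective, np.zeros(G), method="trust-constr",
                       constraints=[LinearConstraint(D2, -np.inf, 0.0)])
        f = np.exp(res.x)
        return f / (f.sum() * dx)              # renormalize on the grid

The weaker concavity constraints the abstract alludes to act, roughly speaking, on a power transform of the density rather than on its logarithm, which admits heavier-tailed fits.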
The finite sample performance of semi- and non-parametric estimators for treatment effects and policy evaluation
The Finite Sample Performance of Semi- and Nonparametric Estimators for Treatment Effects and Policy Evaluation. This paper investigates the finite sample performance of a comprehensive set of semi- and nonparametric estimators for treatment and policy evaluation. In contrast to previous simulation studies, which mostly considered semiparametric approaches relying on parametric propensity score estim...
Performance comparison of new nonparametric independent component analysis algorithm for different entropic indexes
Most independent component analysis (ICA) algorithms use mutual information (MI) measures based on Shannon entropy as a cost function, but Shannon entropy is not the only measure in the literature. In this paper, instead of Shannon entropy, Tsallis entropy is used and a novel ICA algorithm, which uses kernel density estimation (KDE) for estimation of source distributions, is proposed. KDE is di...
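The paper's own algorithm is not reproduced here, but the two ingredients it names combine in a few lines. Below is a generic resubstitution plug-in sketch: a Gaussian KDE stands in for the source density, and the Tsallis entropy of order q, S_q(f) = (1 − ∫ f^q)/(q − 1), is estimated by replacing ∫ f^q = E[f^{q−1}(X)] with a sample average over the data; all names are ours.

    import numpy as np
    from scipy.stats import gaussian_kde

    def tsallis_entropy_kde(sample, q=2.0):
        # Resubstitution plug-in estimate of S_q(f) = (1 - int f^q)/(q - 1).
        # int f^q = E[f^{q-1}(X)] is replaced by the mean of fhat^{q-1}
        # evaluated at the observed points (biased in finite samples,
        # but simple).
        if q == 1.0:
            raise ValueError("q = 1 is the Shannon limit; use a Shannon estimator")
        sample = np.asarray(sample, dtype=float)
        fhat = gaussian_kde(sample)       # bandwidth from Scott's rule
        return (1.0 - np.mean(fhat(sample) ** (q - 1.0))) / (q - 1.0)

    # sanity check: for N(0,1) and q=2, S_2 = 1 - 1/(2*sqrt(pi)), about 0.718
    rng = np.random.default_rng(0)
    print(tsallis_entropy_kde(rng.standard_normal(2000), q=2.0))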
Quasi-continuous maximum entropy distribution approximation with kernel density
This paper extends maximum entropy estimation of discrete probability distributions to the continuous case. This transition leads to a nonparametric estimation of a probability density function, preserving the maximum entropy principle. Furthermore, the derived density estimate provides a minimum mean integrated square error. In a second step it is shown how boundary conditions can be included...
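For contrast with the kernel-based construction described above, the classical moment-constrained route to a continuous maximum entropy density fits f(x) ∝ exp(Σ_j λ_j T_j(x)) by minimizing the convex dual log Z(λ) − λ·T̄. The sketch below is that generic textbook method, not the paper's quasi-continuous KDE approach; the names and the default features (x, x²) are ours.

    import numpy as np
    from scipy.optimize import minimize

    def maxent_density(sample, grid, features=(lambda x: x, lambda x: x**2)):
        # Moment-constrained maximum entropy fit on a 1-D grid: the density
        # exp(lam . T(x) - log Z) matches the sample means of the features.
        T = np.array([f(grid) for f in features])            # (m, G)
        Tbar = np.array([np.mean(f(np.asarray(sample))) for f in features])
        dx = grid[1] - grid[0]

        def dual(lam):
            # convex dual log Z(lam) - lam . Tbar, computed stably
            logits = lam @ T
            m = logits.max()
            return m + np.log(np.exp(logits - m).sum() * dx) - lam @ Tbar

        lam = minimize(dual, np.zeros(len(features)), method="BFGS").x
        f = np.exp(lam @ T - (lam @ T).max())                # unnormalized
        return f / (f.sum() * dx)

    # with features (x, x^2) the fit recovers the moment-matching Gaussian
    rng = np.random.default_rng(0)
    grid = np.linspace(-6.0, 6.0, 801)
    fhat = maxent_density(rng.standard_normal(3000), grid)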